Standardized coefficient

In statistics, standardized coefficients or beta coefficients are the estimates resulting from an analysis carried out on variables that have been standardized so that their variances are 1. A standardized coefficient therefore states how many standard deviations the dependent variable changes per standard-deviation increase in the predictor variable. Standardization of the coefficients is usually done to answer the question of which independent variables have a greater effect on the dependent variable in a multiple regression analysis when the variables are measured in different units (for example, income measured in dollars and family size measured in number of individuals).

Some statistical software packages, such as PSPP, SPSS and SYSTAT, label the standardized regression coefficients "Beta" while the unstandardized coefficients are labeled "B". Others, like DAP/SAS, label them "Standardized Coefficient". Sometimes the unstandardized coefficients are labeled "b" instead of "B".

A regression carried out on the original (unstandardized) variables produces unstandardized coefficients, while a regression carried out on standardized variables produces standardized coefficients. Both kinds of coefficients can, however, be obtained from a single regression on the original variables: the standardized coefficient equals the unstandardized coefficient multiplied by the ratio of the standard deviations of the predictor and the dependent variable.
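In symbols, for an ordinary least-squares regression, if b̂_j is the unstandardized slope of predictor x_j, s_{x_j} the sample standard deviation of x_j and s_y that of the dependent variable y, the corresponding standardized coefficient is

    \hat{\beta}_j = \hat{b}_j \cdot \frac{s_{x_j}}{s_y}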

Before solving a multiple regression equation, all variables (independent and dependent) can be standardized. Each variable is standardized by subtracting its mean from each of its values and then dividing these centered values by the variable's standard deviation. Standardizing all variables in a multiple regression yields standardized regression coefficients that show the change in the dependent variable measured in standard deviations.
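A minimal sketch in Python (using NumPy and a made-up data set with hypothetical variable names) illustrating both routes: fitting on the standardized variables and rescaling the unstandardized slopes by s_x / s_y give the same standardized coefficients.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: income in dollars, family size in persons, and an outcome y.
income = rng.normal(50_000, 15_000, size=200)
family_size = rng.integers(1, 7, size=200).astype(float)
y = 0.00002 * income + 0.3 * family_size + rng.normal(0, 0.5, size=200)

def ols_slopes(X, y):
    """Return least-squares slopes (intercept excluded) after adding a constant column."""
    Xc = np.column_stack([np.ones_like(y), X])
    coef, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return coef[1:]

X = np.column_stack([income, family_size])

# Unstandardized ("B") coefficients from the raw variables.
b = ols_slopes(X, y)

# Standardize every variable: subtract its mean, divide by its standard deviation.
Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
yz = (y - y.mean()) / y.std(ddof=1)

# Standardized ("Beta") coefficients from the standardized variables.
beta = ols_slopes(Xz, yz)

# The same values follow from the unstandardized fit: beta_j = b_j * s_xj / s_y.
beta_from_b = b * X.std(axis=0, ddof=1) / y.std(ddof=1)
print(np.allclose(beta, beta_from_b))  # True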

Advantages
Advocates of standardized coefficients note that the coefficients do not depend on the units in which the independent variables are measured, which makes comparisons across predictors easy.
Disadvantages
Critics voice concerns that such a standardization can be misleading: there is no reason why a change of one standard deviation in one predictor should be regarded as equivalent to a change of one standard deviation in another predictor.

Some variables are easy to affect externally, such as the amount of time spent on an action; others, such as weight or cholesterol level, are more difficult to change; and some, such as height or age, cannot be affected externally at all.
